A Probabilistic Expectation Maximization Algorithm for Multivariate Laplacian Mixtures
Abstract
Estimating the parameters of mixture models is a typical application of the expectation-maximization (EM) algorithm. For the family of multivariate exponential power (MEP) distributions, which generalizes the well-known multivariate Gaussian distribution, we introduce an approximate EM algorithm and a probabilistic variant, called the stochastic EM algorithm, which provides a significant speedup. We prove that, with high probability, single update steps of the probabilistic variant do not differ much from the deterministic solution, assuming certain properties of the input data set. A set of practical experiments shows that this holds true even over multiple successive iterations.
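To make the general EM recipe concrete, the sketch below fits a two-component univariate Laplace mixture, the simplest member of the exponential power family beyond the Gaussian. This is an illustrative assumption on our part, not the paper's algorithm: the paper treats multivariate MEP mixtures with an approximate/stochastic E-step, whereas here the E-step computes exact responsibilities and the M-step uses the weighted median (the maximum-likelihood location estimate under Laplace noise). All function names are our own.

```python
import numpy as np

def laplace_pdf(x, mu, b):
    """Density of a univariate Laplace distribution with location mu, scale b."""
    return np.exp(-np.abs(x - mu) / b) / (2.0 * b)

def weighted_median(x, w):
    """Weighted median of x: the MLE of the Laplace location parameter."""
    order = np.argsort(x)
    cw = np.cumsum(w[order])
    return x[order][np.searchsorted(cw, 0.5 * cw[-1])]

def em_laplace_mixture(x, k=2, iters=100):
    # Deterministic initialization: spread locations over empirical quantiles.
    mu = np.quantile(x, np.linspace(0.25, 0.75, k))
    b = np.full(k, x.std())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        dens = np.stack([pi[j] * laplace_pdf(x, mu[j], b[j]) for j in range(k)])
        resp = dens / dens.sum(axis=0, keepdims=True)
        # M-step: weighted median for locations, weighted mean absolute
        # deviation for scales, mean responsibility for mixing weights.
        for j in range(k):
            mu[j] = weighted_median(x, resp[j])
            b[j] = np.average(np.abs(x - mu[j]), weights=resp[j])
        pi = resp.mean(axis=1)
    return pi, mu, b

# Synthetic data: two well-separated Laplace components at -4 and +4.
rng = np.random.default_rng(1)
data = np.concatenate([rng.laplace(-4.0, 1.0, 500), rng.laplace(4.0, 1.0, 500)])
pi, mu, b = em_laplace_mixture(data)
```

The stochastic variant studied in the paper would replace the full E-step with one computed on a random subsample; the point of the paper's analysis is that, under suitable conditions on the data, such a step stays close to the deterministic update with high probability.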
Similar papers
Laplacian Mixture Modeling for Overcomplete Mixing Matrix in Wavelet Packet Domain by Adaptive EM-type Algorithm and Comparisons
Speech processing has benefited a great deal from wavelet transforms. Wavelet packets decompose signals into broader components using linear spectral bisecting. In this paper, mixtures of speech signals are decomposed using wavelet packets, and the phase difference between the two mixtures is investigated in the wavelet domain. In our method a Laplacian Mixture Model (LMM) is defined. An Expectation M...
Independent Vector Analysis for Source Separation Using a Mixture of Gaussians Prior
Convolutive mixtures of signals, which are common in acoustic environments, can be difficult to separate into their component sources. Here we present a uniform probabilistic framework to separate convolutive mixtures of acoustic signals using independent vector analysis (IVA), which is based on a joint distribution for the frequency components originating from the same source and is capable of...
Probabilistic Model-based Clustering of Multivariate and Sequential Data
Probabilistic model-based clustering, based on finite mixtures of multivariate models, is a useful framework for clustering data in a statistical context. This general framework can be directly extended to clustering of sequential data, based on finite mixtures of sequential models. In this paper we consider the problem of fitting mixture models where both multivariate and sequential observations are...
Stochastic approximation learning for mixtures of multivariate elliptical distributions
Most current approaches to mixture modeling consider mixture components from a few families of probability distributions, in particular from the Gaussian family. The reasons for these preferences can be traced to their training algorithms, typically versions of the Expectation-Maximization (EM) method. The re-estimation equations needed by this method become very complex as the mixture compone...
Multivariate Structural Bernoulli Mixtures for Recognition of Handwritten Numerals
As shown recently, the structural optimization of probabilistic neural networks can be included in the EM algorithm by introducing a special type of multivariate Bernoulli mixture. However, the underlying log-likelihood criterion is known to be multimodal in the case of mixtures, and therefore the EM iteration process may be starting-point dependent. In the present paper we discuss the possibility of ...
Publication date: 2014